Meta Accelerates AI Chip Development with Four New MTIA Generations
Meta is aggressively expanding its AI infrastructure with four new generations of its proprietary MTIA chips slated for release within the next two years. The Meta Training and Inference Accelerator line, launched in 2023, will see successive iterations labeled MTIA 300 through 500, each promising incremental gains in compute power and memory bandwidth.
The chips are purpose-built for Meta's specific workloads—ranking, recommendations, and generative AI tasks across its platforms. Unlike general-purpose silicon, MTIA forms part of a custom full-stack solution that Meta claims delivers superior efficiency and cost savings for its massive-scale operations.
Meta already deploys hundreds of thousands of these chips for inference work and is pursuing a "portfolio approach," blending in-house silicon with third-party processors to maintain flexibility. This strategy avoids vendor lock-in while scaling AI capabilities across the company's content and advertising systems.